Improved grey wolf optimization algorithm for constrained optimization problem
LONG Wen, ZHAO Dongquan, XU Songjin
Journal of Computer Applications    2015, 35 (9): 2590-2595.   DOI: 10.11772/j.issn.1001-9081.2015.09.2590
The standard Grey Wolf Optimization (GWO) algorithm suffers from low solution precision, slow convergence, and poor local search ability. To overcome these shortcomings, an Improved GWO (IGWO) algorithm was proposed for solving constrained optimization problems. A non-stationary multi-stage assignment penalty function method was used to handle the constraints, converting the original constrained optimization problem into an unconstrained one, which was then solved by the proposed IGWO algorithm. In IGWO, good point set theory was used to initialize the population, strengthening the diversity of the global search, and the Powell search method was applied to the current optimal individual to improve local search ability and accelerate convergence. Simulation experiments were conducted on well-known benchmark constrained optimization problems. The results show that the proposed algorithm not only overcomes the shortcomings of the original GWO algorithm but also outperforms differential evolution and particle swarm optimization algorithms.
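The penalty-function conversion described in this abstract can be sketched generically: constraint violations are weighted by an iteration-dependent (non-stationary) factor and a violation-dependent (multi-stage) exponent, then added to the objective. The coefficients below are illustrative assumptions, not the ones used in the paper.

```python
import numpy as np

def penalized(f, constraints, x, k):
    """Unconstrained surrogate of a constrained objective via a
    non-stationary multi-stage penalty (illustrative coefficients)."""
    # each constraint is expected in the form g(x) <= 0; positive values are violations
    violations = np.array([max(0.0, g(x)) for g in constraints])
    # multi-stage assignment: larger violations receive a larger exponent
    theta = np.where(violations < 1.0, 1.0, 2.0)
    # non-stationary factor: the penalty weight grows with the iteration count k
    h = np.sqrt(k + 1)
    return f(x) + h * np.sum(violations ** theta)

# toy example: minimize x0^2 + x1^2 subject to x0 + x1 >= 1, i.e. 1 - x0 - x1 <= 0
f = lambda x: x[0] ** 2 + x[1] ** 2
g = [lambda x: 1.0 - x[0] - x[1]]
print(penalized(f, g, np.array([0.0, 0.0]), k=10))  # penalty term only, since f = 0 here
```

An optimizer such as GWO can then minimize `penalized` directly; feasible points incur no penalty, so the surrogate agrees with the original objective on the feasible region.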
Hybrid cuckoo search algorithm for solving constrained chemical engineering optimization problems
LONG Wen, CHEN Le
Journal of Computer Applications    2014, 34 (2): 523-527.  
The Cuckoo Search (CS) algorithm has several disadvantages in global search, including slow convergence and a high likelihood of becoming trapped in local optima. To overcome these disadvantages, an effective hybrid CS algorithm based on Rosenbrock local search and Cauchy mutation was proposed to solve constrained numerical and chemical engineering optimization problems. Firstly, the good point set method was used to initialize the positions of the bird nests, which strengthened the diversity of the global search. Secondly, the Rosenbrock local search technique was applied to the current best position to improve the convergence speed. Thirdly, a Gaussian mutation operator was applied to the global optimum of each generation, so that the algorithm could effectively jump out of local minima. The proposed algorithm was tested on several constrained numerical functions and chemical engineering optimization problems, and the results show its promising performance.
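The mutation step applied to the global best can be sketched as a heavy-tailed perturbation; a Cauchy distribution (named in the first sentence of the abstract) produces occasional large jumps that help escape local optima. The `scale` parameter and bounds below are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def cauchy_mutate(best, lower, upper, scale=1.0):
    """Perturb the current global best with a heavy-tailed Cauchy step,
    clipping the result back into the search bounds."""
    step = scale * rng.standard_cauchy(best.shape)  # occasional very large moves
    return np.clip(best + step, lower, upper)

best = np.array([0.5, -0.2, 1.3])
mutant = cauchy_mutate(best, lower=-5.0, upper=5.0)
```

In the hybrid scheme, the mutant would replace the global best only if it improves the (penalized) objective, so the escape attempt never discards a better solution.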
Parameter estimation for reaction kinetics model based on composite genetic algorithm
LONG Wen, JIAO Jian-jun, XU Song-jin
Journal of Computer Applications    2012, 32 (06): 1704-1706.   DOI: 10.3724/SP.J.1087.2012.01704
By establishing an appropriate fitness function, the parameter estimation problem for a residue hydrofining reaction kinetics model was formulated as a multi-dimensional function optimization problem, which was then solved by a Composite Genetic Algorithm (CGA). A chaotic sequence design method was introduced to construct an initial population scattered uniformly over the entire search space in order to maintain diversity. At each generation, the CGA randomly combined several effective crossover strategies with suitable mutation strategies to create new offspring individuals. Simulation results on four benchmark problems demonstrate the effectiveness and robustness of the proposed algorithm. Taking a catalytic cracking unit in an oil refinery as an example, a numerical application of parameter estimation for the residue hydrofining reaction kinetics model was solved, and satisfactory results were obtained.
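The fitness-function formulation can be sketched as a sum of squared errors between model predictions and measured data; since the abstract does not give the residue hydrofining model itself, a first-order decay model is used below purely as an illustrative stand-in.

```python
import numpy as np

def fitness(params, times, measured, model):
    """Sum-of-squared-errors fitness for kinetics parameter estimation:
    lower is better, zero at a perfect fit."""
    predicted = model(params, times)
    return np.sum((predicted - measured) ** 2)

# illustrative first-order decay model c(t) = c0 * exp(-k*t); params = (c0, k)
model = lambda p, t: p[0] * np.exp(-p[1] * t)
t = np.array([0.0, 1.0, 2.0])
data = model(np.array([1.0, 0.5]), t)  # synthetic "measurements" at the true parameters
print(fitness(np.array([1.0, 0.5]), t, data, model))  # 0.0 at the true parameters
```

A genetic algorithm then searches the parameter space for the vector minimizing this fitness, turning estimation into the multi-dimensional optimization problem the abstract describes.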
Differential evolution algorithm with dynamically adjusting number of subpopulation individuals
XU Song-jin, LONG Wen
Journal of Computer Applications    2011, 31 (11): 3101-3103.   DOI: 10.3724/SP.J.1087.2011.03101
A Novel Parallel Differential Evolution (NPDE) algorithm that dynamically adjusts the number of individuals in each subpopulation was proposed for solving complex optimization problems. In the NPDE algorithm, the initial population was divided into three subpopulations based on the fitness values of the individuals, which were employed for global and local search respectively. The sizes of the subpopulations were dynamically adjusted according to the search phase, and different mutation strategies were applied to the different subpopulations, coordinating the exploitation and exploration abilities of the algorithm. Experiments on various benchmark functions were designed to test the performance of the NPDE algorithm, and the results show that it achieves high performance on a variety of complex problems.
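The fitness-based partitioning step can be sketched as follows: rank the population by objective value and split it into three groups, each of which would then use its own DE mutation strategy. The group sizes and the strategy assignment in the comment are assumptions for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def partition(pop, fit, sizes):
    """Split a population into three subpopulations by fitness rank
    (minimization: lowest fitness values are the best individuals)."""
    order = np.argsort(fit)  # indices of individuals, best first
    n1, n2, _ = sizes        # sizes would be adapted between search phases
    return pop[order[:n1]], pop[order[n1:n1 + n2]], pop[order[n1 + n2:]]

pop = rng.uniform(-5.0, 5.0, size=(12, 2))
fit = np.sum(pop ** 2, axis=1)  # sphere function as a toy objective
elite, middle, explore = partition(pop, fit, sizes=(4, 4, 4))
# e.g. an exploitative strategy such as DE/best/1 on the elite group,
# an explorative one such as DE/rand/1 on the worst group
```

Re-ranking and re-partitioning each generation lets the split track the current state of the search, which is the mechanism behind the dynamic size adjustment described above.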
Improved particle swarm optimization based on dynamic random search technique and good-point set
LIANG Xi-ming, CHEN Fu, LONG Wen
Journal of Computer Applications    2011, 31 (10): 2796-2799.   DOI: 10.3724/SP.J.1087.2011.02796
To overcome the poor local search ability and premature convergence of the Particle Swarm Optimization (PSO) algorithm, an improved PSO approach based on Dynamic Random Search Technique (DRST) and the good-point set was proposed in this paper. DRST was introduced to optimize the current best position of the swarm, while reinitialization in a good-point-set manner was employed to pull a prematurely converged swarm out of local optima. A line search along the negative gradient direction was also applied to accelerate the optimization. Experimental results show that the improved algorithm converges rapidly, has a strong ability to prevent premature convergence, and performs better than Standard Particle Swarm Optimization (SPSO) and Dissipative Particle Swarm Optimization (DPSO).
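Good-point-set initialization, used here (and in the IGWO and hybrid CS papers above) to spread points evenly over the search space, can be sketched with one common construction: take fractional parts of multiples of r_j = 2cos(2πj/p) with p a prime no smaller than 2*dim + 3, then map them to the bounds. This is a generic sketch of the technique, not necessarily the exact construction used in the paper.

```python
import numpy as np

def good_point_set(n, dim, lower, upper):
    """Generate n points in [lower, upper]^dim from a good point set
    (construction: r_j = 2*cos(2*pi*j/p), p the smallest prime >= 2*dim + 3)."""
    p = 2 * dim + 3
    while any(p % q == 0 for q in range(2, int(p ** 0.5) + 1)):
        p += 1  # advance to the smallest prime >= 2*dim + 3
    j = np.arange(1, dim + 1)
    r = 2.0 * np.cos(2.0 * np.pi * j / p)
    i = np.arange(1, n + 1).reshape(-1, 1)
    frac = (i * r) % 1.0  # fractional parts lie in [0, 1)
    return lower + frac * (upper - lower)

swarm = good_point_set(30, 5, lower=-10.0, upper=10.0)
```

Unlike pseudo-random initialization, the resulting points have low discrepancy, so even a small swarm covers the space evenly; this is why reinitializing with a good point set helps restart a prematurely converged swarm.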